Probabilistic error bounds for the discrepancy of mixed sequences

Authors

  • Christoph Aistleitner
  • Markus Hofer
Abstract

In many applications Monte Carlo (MC) sequences or quasi-Monte Carlo (QMC) sequences are used for numerical integration. In moderate dimensions the QMC method typically yields better results, but its performance deteriorates significantly as the dimension increases. One class of randomized QMC sequences, which tries to combine the advantages of MC and QMC, consists of so-called mixed sequences, which are constructed by concatenating a d-dimensional QMC sequence and an (s−d)-dimensional MC sequence to obtain a sequence in dimension s. Ökten, Tuffin and Burago proved probabilistic asymptotic bounds for the discrepancy of mixed sequences, which were refined by Gnewuch. In this paper we use an interval partitioning technique to obtain improved probabilistic bounds for the discrepancy of mixed sequences. By comparing them with lower bounds we show that our results are almost optimal.
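The construction described in the abstract can be sketched in a few lines: each point of a mixed sequence takes its first d coordinates from a low-discrepancy (QMC) sequence and its remaining s−d coordinates from a pseudo-random (MC) generator. The sketch below is an assumption for illustration only, using a Halton sequence as the QMC part; the paper's results do not depend on this particular choice.

```python
import random

def van_der_corput(n, base):
    """Radical inverse of n in the given base: one Halton coordinate."""
    x, denom = 0.0, 1.0
    while n > 0:
        denom *= base
        n, rem = divmod(n, base)
        x += rem / denom
    return x

def mixed_sequence(num_points, d, s, seed=0):
    """Mixed sequence in [0,1)^s: first d coordinates from a Halton (QMC)
    sequence, remaining s-d coordinates pseudo-random (MC)."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29][:d]  # one base per QMC dim
    rng = random.Random(seed)
    points = []
    for n in range(1, num_points + 1):
        qmc_part = [van_der_corput(n, p) for p in primes]
        mc_part = [rng.random() for _ in range(s - d)]
        points.append(qmc_part + mc_part)
    return points

pts = mixed_sequence(100, d=3, s=8)
```

Each point `pts[n]` is an s-dimensional vector whose QMC coordinates are deterministic and whose MC coordinates are random; the discrepancy bounds in the paper are probabilistic precisely because of this random part.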


Similar articles

On probabilistic results for the discrepancy of a hybrid-Monte Carlo sequence

In many applications it has been observed that hybrid-Monte Carlo sequences perform better than Monte Carlo and quasi-Monte Carlo sequences, especially in difficult problems. For a mixed s-dimensional sequence m, whose elements are vectors obtained by concatenating d-dimensional vectors from a low-discrepancy sequence q with (s−d)-dimensional random vectors, probabilistic upper bounds for its s...


ACTA UNIVERSITATIS APULENSIS No 18/2009 ON G-DISCREPANCY AND MIXED MONTE CARLO AND QUASI-MONTE CARLO SEQUENCES

The G-star and the G-discrepancy are generalizations of the well-known star and extreme discrepancy, where G denotes a given continuous distribution function on the d-dimensional unit cube [0, 1]^d. We list and prove some results that describe the behavior of the G-star and the G-discrepancy in terms of the dimension d and the number of sample points N. Our main focus is on so-called mixed ...


On G-Discrepancy and Mixed Monte Carlo and Quasi-Monte Carlo Sequences

The G-star and the G-discrepancy are generalizations of the well-known star and extreme discrepancy, where G denotes a given continuous distribution function on the d-dimensional unit cube [0, 1]^d. We list and prove some results that describe the behavior of the G-star and the G-discrepancy in terms of the dimension d and the number of sample points N. Our main focus is on so-called mixe...


Nuclear Discrepancy for Active Learning

Active learning algorithms propose which unlabeled objects should be queried for their labels to improve a predictive model the most. We study active learners that minimize generalization bounds and uncover relationships between these bounds that lead to an improved approach to active learning. In particular we show the relation between the bound of the state-of-the-art Maximum Mean Discrepancy...


Probabilistic Lower Bounds for the Discrepancy of Latin Hypercube Samples

We provide probabilistic lower bounds for the star discrepancy of Latin hypercube samples. These bounds are sharp in the sense that they match the recent probabilistic upper bounds for the star discrepancy of Latin hypercube samples proved in [M. Gnewuch, N. Hebbinghaus. Discrepancy bounds for a class of negatively dependent random points including Latin hypercube samples. Preprint 2016.]. Toge...



Journal:
  • Monte Carlo Meth. and Appl.

Volume 18  Issue 

Pages  -

Publication year 2012